Tensor Decompositions and Sparse Log-linear Models.

Authors

  • James E. Johndrow
  • Anirban Bhattacharya
  • David B. Dunson
Abstract

Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these notions of dimensionality reduction in the two paradigms. We derive several results relating the support of a log-linear model to nonnegative ranks of the associated probability tensor. Motivated by these findings, we propose a new collapsed Tucker class of tensor decompositions, which bridge existing PARAFAC and Tucker decompositions, providing a more flexible framework for parsimoniously characterizing multivariate categorical data. Taking a Bayesian approach to inference, we illustrate empirical advantages of the new decompositions.
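To fix notation for the two notions of dimensionality reduction being compared, a brief sketch, assuming a probability tensor $\pi$ for $p$ categorical variables: the PARAFAC and Tucker forms are standard, while the collapsed Tucker line is a hedged reading of the abstract's description, with the group-assignment map $z$ introduced here for illustration only.

```latex
% PARAFAC: one latent class index h shared by all p variables.
\pi_{c_1 \cdots c_p} = \sum_{h=1}^{k} \nu_h \prod_{j=1}^{p} \lambda^{(j)}_{h c_j},
\qquad \nu_h \ge 0, \quad \sum_{h} \nu_h = 1.

% Tucker: one latent index per variable, coupled through a core tensor g.
\pi_{c_1 \cdots c_p} = \sum_{h_1=1}^{k_1} \cdots \sum_{h_p=1}^{k_p}
    g_{h_1 \cdots h_p} \prod_{j=1}^{p} \lambda^{(j)}_{h_j c_j}.

% Collapsed Tucker (hedged sketch): the p variables share r < p latent
% indices via an assignment z : {1,...,p} -> {1,...,r}, so the class
% interpolates between PARAFAC (r = 1) and Tucker (r = p).
\pi_{c_1 \cdots c_p} = \sum_{h_1=1}^{k_1} \cdots \sum_{h_r=1}^{k_r}
    g_{h_1 \cdots h_r} \prod_{j=1}^{p} \lambda^{(j)}_{h_{z(j)} c_j}.
```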


Similar articles

Tensor Decompositions: A New Concept in Brain Data Analysis?

Matrix factorizations and their extensions to tensor factorizations and decompositions have become prominent techniques for linear and multilinear blind source separation (BSS), especially multiway Independent Component Analysis (ICA), Nonnegative Matrix and Tensor Factorization (NMF/NTF), Smooth Component Analysis (SmoCA) and Sparse Component Analysis (SCA). Moreover, tensor decompositions hav...
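As one concrete member of this family of methods, a minimal nonnegative matrix factorization (NMF) sketch using the classical Lee–Seung multiplicative updates; the toy matrix, rank, and iteration count below are illustrative assumptions, not anything from the cited article.

```python
import numpy as np

def nmf_multiplicative(X, rank, n_iter=200, eps=1e-10, seed=0):
    """Factor a nonnegative matrix X (m x n) as W @ H with W (m x rank)
    and H (rank x n), using Lee-Seung multiplicative updates for the
    Frobenius-norm objective; the updates preserve nonnegativity."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy usage on a random nonnegative matrix (illustrative only).
X = np.random.default_rng(1).random((20, 15))
W, H = nmf_multiplicative(X, rank=3)
print("relative error:", np.linalg.norm(X - W @ H) / np.linalg.norm(X))
```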


Sparse Higher-Order Principal Components Analysis

Traditional tensor decompositions such as the CANDECOMP/PARAFAC (CP) and Tucker decompositions yield higher-order principal components that have been used to understand tensor data in areas such as neuroimaging, microscopy, chemometrics, and remote sensing. Sparsity in high-dimensional matrix factorizations and principal components has been well studied, exhibiting many benefits; less attentio...
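To make the sparsity idea concrete, a hedged rank-1 sketch: higher-order power iterations on a 3-way tensor with each factor soft-thresholded before renormalization, in the spirit of sparse higher-order PCA; the penalty level and toy tensor are assumptions, not the cited paper's exact algorithm.

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of the l1 penalty: shrink entries toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_rank1(T, lam=0.1, n_iter=50, seed=0):
    """Sparse rank-1 sketch for a 3-way tensor T: alternate tensor
    contractions (power iterations) with soft-thresholding of each factor."""
    rng = np.random.default_rng(seed)
    u, v, w = (rng.standard_normal(n) for n in T.shape)
    u, v, w = u / np.linalg.norm(u), v / np.linalg.norm(v), w / np.linalg.norm(w)
    for _ in range(n_iter):
        u = soft_threshold(np.einsum('ijk,j,k->i', T, v, w), lam)
        u /= max(np.linalg.norm(u), 1e-12)
        v = soft_threshold(np.einsum('ijk,i,k->j', T, u, w), lam)
        v /= max(np.linalg.norm(v), 1e-12)
        w = soft_threshold(np.einsum('ijk,i,j->k', T, u, v), lam)
        w /= max(np.linalg.norm(w), 1e-12)
    sigma = np.einsum('ijk,i,j,k->', T, u, v, w)  # leading coefficient
    return sigma, u, v, w

# Toy usage: a noisy rank-1 tensor whose factors have 3 nonzero entries each.
rng = np.random.default_rng(2)
a = np.zeros(10); a[:3] = 1.0
T = np.einsum('i,j,k->ijk', a, a, a) + 0.01 * rng.standard_normal((10, 10, 10))
sigma, u, v, w = sparse_rank1(T)
print(round(sigma, 2), np.count_nonzero(u), np.count_nonzero(v))
```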


ParCube: Sparse Parallelizable Tensor Decompositions

How can we efficiently decompose a tensor into sparse factors when the data does not fit in memory? Tensor decompositions have gained steadily increasing popularity in data-mining applications; however, current state-of-the-art decomposition algorithms operate in main memory and do not scale to truly large datasets. In this work, we propose ParCube, a new and highly parallelizable method for ...
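One mechanism used by sampling-based parallel decompositions is biased subsampling: keep slices of each mode with probability proportional to their mass, decompose the small subtensors in parallel, and stitch the factors back together. Below is a hedged sketch of just the sampling step, in the spirit of ParCube; the merge step is omitted, and all names here are our own illustration, not ParCube's API.

```python
import numpy as np

def biased_subsample(T, fraction=0.25, seed=0):
    """For each mode, keep a random subset of slice indices drawn with
    probability proportional to slice mass, so the resulting subtensor
    retains most of the tensor's energy (sampling-step sketch only)."""
    rng = np.random.default_rng(seed)
    index_sets = []
    for mode in range(T.ndim):
        other_axes = tuple(ax for ax in range(T.ndim) if ax != mode)
        weights = np.abs(T).sum(axis=other_axes)      # mass of each slice
        k = max(1, int(fraction * T.shape[mode]))
        keep = rng.choice(T.shape[mode], size=k, replace=False,
                          p=weights / weights.sum())
        index_sets.append(np.sort(keep))
    return T[np.ix_(*index_sets)], index_sets

# Toy usage: shrink a 40x40x40 tensor to a 10x10x10 subtensor.
T = np.abs(np.random.default_rng(4).standard_normal((40, 40, 40)))
sub, kept = biased_subsample(T)
print(sub.shape, [idx.shape[0] for idx in kept])
```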


Tensor-structured methods for parameter dependent and stochastic elliptic PDEs

Modern methods of tensor-product decomposition allow an efficient data-sparse approximation of functions and operators in higher dimensions [5]. The recent quantics-TT (QTT) tensor method makes it possible to represent multidimensional data with log-volume complexity [1, 2, 3]. We discuss the convergence rate of the Tucker, canonical, and QTT stochastic collocation tensor approximations to the solution...
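To illustrate the tensor-train idea underlying QTT, a minimal TT-SVD sketch: reshape a length-2^d vector into a d-way tensor with all mode sizes 2, then compress with sequential truncated SVDs; the tolerance and the test function are illustrative assumptions.

```python
import numpy as np

def tt_svd(tensor, tol=1e-10):
    """Decompose a d-way tensor into TT cores G_k of shape
    (r_{k-1}, n_k, r_k) by sequential truncated SVDs. With all mode
    sizes equal to 2 (the quantics format), storage is O(d r^2) for a
    tensor of volume 2^d, i.e. log-volume complexity when the TT
    ranks r stay small."""
    dims, d = tensor.shape, tensor.ndim
    cores, rank = [], 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        keep = max(1, int(np.sum(s > tol * s[0])))   # truncation rank
        cores.append(U[:, :keep].reshape(rank, dims[k], keep))
        mat = (s[:keep, None] * Vt[:keep]).reshape(keep * dims[k + 1], -1)
        rank = keep
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

# QTT-style usage: 2**12 samples of exp(-x), reshaped into 12 modes of size 2.
d = 12
x = np.linspace(0.0, 1.0, 2**d)
cores = tt_svd(np.exp(-x).reshape((2,) * d))
print("TT ranks:", [c.shape[2] for c in cores[:-1]])
```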


Shape Constrained Tensor Decompositions using Sparse Representations in Over-Complete Libraries

We consider N-way data arrays and low-rank tensor factorizations where the time mode is coded as a sparse linear combination of temporal elements from an over-complete library. Our method, Shape Constrained Tensor Decomposition (SCTD), is based upon the CANDECOMP/PARAFAC (CP) decomposition, which produces rank-r approximations of data tensors via outer products of vectors in each dimension of th...
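A hedged sketch of the sparse-coding step described here: represent a temporal factor as a sparse combination of atoms from an over-complete library via an l1 (lasso) fit. The sinusoid library and penalty level are illustrative assumptions, not the SCTD pipeline itself.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Over-complete library of temporal atoms (sinusoids at 60 frequencies
# over 200 time points); the choice of atoms is an assumption.
t = np.linspace(0.0, 1.0, 200)
library = np.column_stack([np.sin(2 * np.pi * f * t) for f in range(1, 61)])

# A "time-mode factor" built from two library atoms plus noise.
rng = np.random.default_rng(3)
signal = (1.5 * library[:, 4] - 0.8 * library[:, 19]
          + 0.05 * rng.standard_normal(t.size))

# The lasso recovers a sparse coefficient vector over the library.
coder = Lasso(alpha=0.01, max_iter=10_000)
coder.fit(library, signal)
active = np.flatnonzero(coder.coef_)
print("active atoms:", active, "coefficients:", np.round(coder.coef_[active], 2))
```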



Journal:
  • Annals of Statistics

Volume 45, Issue 1

Pages: –

Publication date: 2017